Gaze information has the potential to benefit Human-Computer Interaction (HCI) tasks, particularly when combined with speech. As a secondary input modality, gaze can improve our understanding of user intention; it can also serve as the main input modality for users with some level of permanent or temporary impairment. In this paper we describe a multimodal HCI system prototype that supports speech, gaze, and the combination of both. The system has been developed for Active Assisted Living scenarios.